340 research outputs found

    Economics of payment cards: a status report

    This article surveys the recent theoretical literature on payment cards (focusing on debit and credit cards) and studies this research's possible implications for the current public policy debate over payment card networks and the pricing of their services for both consumers and merchants. Keywords: Payment systems; Credit cards.

    Expressing coherence of musical perception in formal logic

    Formal logic can be used for expressing certain aspects of musical coherence. In this paper, a framework is developed which aims at linking expressions in the formal language to an underlying interpretation in terms of musical images and image transformations. Such an interpretation characterizes truth within a framework of spatio-temporal representations and perception-based musical information processing. The framework provides a way of defining a semantics for the coherence of musical perception.

    What does touch tell us about emotions in touchscreen-based gameplay?

    Nowadays, more and more people play games on touch-screen mobile phones. This phenomenon raises a very interesting question: does touch behaviour reflect the player's emotional state? If so, this would be a valuable evaluation indicator not only for game designers but also for real-time personalization of the game experience. Psychology studies on acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. Based on touch behaviour, machine learning algorithms were used to build systems for automatically discriminating between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal and two levels of valence. The results were very interesting, reaching between 69% and 77% correct discrimination between the four emotional states. Higher results (~89%) were obtained for discriminating between two levels of arousal and two levels of valence.
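
    The abstract describes extracting finger-stroke features and training classifiers to separate four emotional states. As a minimal sketch of that kind of pipeline (the feature names, the SVM classifier, and the random placeholder data below are assumptions for illustration, not the authors' actual method):

        # Illustrative sketch only: feature names, the SVM classifier, and the
        # placeholder data are assumptions, not the pipeline used in the paper.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Hypothetical per-stroke features: duration, length, mean speed,
        # contact-area proxy for pressure, and direction variability.
        rng = np.random.default_rng(0)
        X = rng.random((200, 5))                       # placeholder feature matrix
        y = rng.choice(["Excited", "Relaxed", "Frustrated", "Bored"], size=200)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        scores = cross_val_score(clf, X, y, cv=5)      # per-fold accuracy
        print("Mean 4-class accuracy: %.2f" % scores.mean())

    With random placeholder features the accuracy stays near chance (25%); the sketch only shows the shape of a stroke-feature classification pipeline.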

    Interpersonal sensorimotor communication shapes intrapersonal coordination in a musical ensemble

    Social behaviors rely on the coordination of multiple effectors within one's own body as well as between the interacting bodies. However, little is known about how coupling at the interpersonal level impacts coordination among body parts at the intrapersonal level, especially in complex, ecological situations. Here, we perturbed interpersonal sensorimotor communication in violin players of an orchestra and investigated how this impacted the musicians' intrapersonal movement coordination. More precisely, first section violinists were asked to turn their backs to the conductor and to face the second section violinists, who still faced the conductor. Motion capture of head and bow kinematics showed that altering the usual interpersonal coupling scheme increased intrapersonal coordination. Our perturbation also induced smaller yet more complex head movements, which spanned multiple, faster timescales that closely matched the metrical levels of the musical score. Importantly, the perturbation differentially increased intrapersonal coordination across these timescales. We interpret this behavioral shift as a sensorimotor strategy that exploits periodic movements to effectively tune sensory processing in time and allows coping with the disruption of the interpersonal coupling scheme. As such, head movements, which are usually deemed to fulfill communicative functions, may also be adapted to help regulate one's own performance in time.

    Analysis of movement quality in full-body physical activities

    Full-body human movement is characterized by fine-grained expressive qualities that humans are easily capable of exhibiting and recognizing in others' movement. In sports (e.g., martial arts) and performing arts (e.g., dance), the same sequence of movements can be performed in a wide range of ways characterized by different qualities, often in terms of subtle (spatial and temporal) perturbations of the movement. Even a non-expert observer can distinguish between a top-level and an average performance by a dancer or martial artist. The difference is not in the performed movements (the same in both cases) but in the "quality" of their performance. In this article, we present a computational framework aimed at an automated approximate measure of movement quality in full-body physical activities. Starting from motion capture data, the framework computes low-level (e.g., the velocity of a limb) and high-level (e.g., synchronization between different limbs) movement features. This vector of features is then integrated to compute a value aimed at providing a quantitative assessment of movement quality, approximating the evaluation that an external expert observer would give of the same sequence of movements. Next, a system representing a concrete implementation of the framework is proposed. Karate is adopted as a testbed. We selected two different katas (i.e., detailed choreographies of movements in karate) characterized by different overall attitudes and expressions (aggressiveness, meditation), and we asked seven athletes of varying age and experience to perform them. Motion capture data were collected from the performances and analyzed with the system. The results of the automated analysis were compared with the scores given by 14 karate experts who rated the same performances. Results show that the movement-quality scores computed by the system and the ratings given by the human observers are highly correlated (Pearson's r = 0.84, p = 0.001 and r = 0.75, p = 0.005).
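
    To make the framework's two-stage idea concrete (compute low- and high-level movement features, fuse them into one quality value, then correlate the values with expert ratings), here is a minimal Python sketch; the specific features, the weighted sum, and the placeholder numbers are assumptions rather than the paper's actual formulation:

        # Illustrative sketch: features, weights, and placeholder data are assumptions.
        import numpy as np
        from scipy.stats import pearsonr

        def mean_limb_speed(positions, dt):
            """Low-level feature: mean speed of a limb marker; positions is a (T, 3) trajectory."""
            return np.linalg.norm(np.diff(positions, axis=0), axis=1).mean() / dt

        def interlimb_synchronization(speed_a, speed_b):
            """High-level feature: correlation between two limbs' speed profiles."""
            return np.corrcoef(speed_a, speed_b)[0, 1]

        def quality_score(low_level, high_level, w=0.5):
            """Fuse normalized low- and high-level features into one quality value."""
            return w * low_level + (1 - w) * high_level

        # Placeholder scores for seven performances, compared with expert ratings.
        system_scores = np.array([0.71, 0.55, 0.82, 0.60, 0.77, 0.49, 0.68])
        expert_ratings = np.array([7.0, 5.5, 8.5, 6.0, 7.5, 5.0, 7.0])
        r, p = pearsonr(system_scores, expert_ratings)
        print(f"Pearson r = {r:.2f}, p = {p:.3f}")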

    Multi-score Learning for Affect Recognition: the Case of Body Postures

    An important challenge in building automatic affective state recognition systems is establishing the ground truth. When the ground truth is not available, observers are often used to label training and testing sets. Unfortunately, inter-rater reliability between observers tends to vary from fair to moderate when dealing with naturalistic expressions. Nevertheless, the most common approach is to label each expression with the most frequent label assigned by the observers to that expression. In this paper, we propose a general pattern recognition framework that takes into account the variability between observers for automatic affect recognition. This leads to what we term a multi-score learning problem, in which a single expression is associated with multiple values representing the scores of each available emotion label. We also propose several performance measures and pattern recognition methods for this framework, and report the experimental results obtained when testing and comparing these methods on two affective posture datasets.
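
    A small sketch of the multi-score representation the abstract describes: each expression keeps a score per emotion label (here, the fraction of observers choosing that label) instead of a single majority label. The label set, the features, and the multi-output regressor below are assumptions used only for illustration, not the methods proposed in the paper:

        # Illustrative sketch: label set, features, and regressor are assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        labels = ["happy", "sad", "angry", "fearful"]   # hypothetical label set

        def observer_scores(votes, labels):
            """Turn observer votes into one score per label (vote fractions)."""
            counts = np.array([votes.count(l) for l in labels], dtype=float)
            return counts / counts.sum()

        votes_per_sample = [["happy", "happy", "sad"],
                            ["angry", "angry", "fearful"]]
        Y = np.vstack([observer_scores(v, labels) for v in votes_per_sample])

        rng = np.random.default_rng(0)
        X = rng.random((len(Y), 10))                    # placeholder posture features
        model = RandomForestRegressor(random_state=0).fit(X, Y)
        print(model.predict(X[:1]))                     # predicted score per label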

    Perceiving animacy and arousal in transformed displays of human interaction

    When viewing a moving abstract stimulus, people tend to attribute social meaning and purpose to the movement. The classic work of Heider and Simmel [1] investigated how observers would describe the movement of simple geometric shapes (a circle, triangles, and a square) around a screen. A high proportion of participants reported seeing some form of purposeful interaction between the three abstract objects and described this interaction as a social encounter. Various papers have subsequently found similar results [2,3] and gone on to show that, as Heider and Simmel suggested, the phenomenon is due more to the relationship of the objects in space and time than to any particular object characteristic. The research of Tremoulet and Feldman [4] has shown that the percept of animacy may be elicited by a solitary moving object. They asked observers to rate the movement of a single dot or rectangle for whether it was under the influence of an external force or in control of its own motion. At mid-trajectory the shape would change speed, direction, or both. They found that shapes that either changed direction by more than 25 degrees from the original trajectory or changed speed were judged to be "more alive" than others. Further discussion and evidence of animacy with one or two small dots can be found in Gelman, Durgin and Kaufman [5]. Our aim was to further study this phenomenon by using a different method of stimulus production. Previous methods for producing displays of animate objects have relied either on handcrafted stimuli or on parametric variations of simple motion patterns. We aim to work towards a new automatic approach by taking actual human movements, transforming them into basic shapes, and exploring what motion properties need to be preserved to obtain animacy. Though the phenomenon of animacy has been demonstrated for many years, using various different displays, very few specific criteria have been established for the essential characteristics of the displays. Part of this research is to try to establish which movements result in percepts of animacy and, in turn, to give further understanding of the essential characteristics of human movement and social interaction. In this paper we discuss two experiments in which we examine how different transformations of an original video of a dance influence the perception of animacy. We also examine reports of arousal (Experiment 1) and emotional engagement (Experiment 2).

    Bridging the gap between emotion and joint action

    Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., in team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and conversely, joint action research has not yet found a way to include emotion as one of the key parameters for modeling socio-motor interaction. In this review, we first identify the gap and then compile evidence from various branches of science showing a strong entanglement between emotion and acting together. We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced by modern societies.

    Go-with-the-flow: Tracking, Analysis and Sonification of Movement and Breathing to Build Confidence in Activity Despite Chronic Pain

    Chronic (persistent) pain (CP) affects one in ten adults; clinical resources are insufficient, and anxiety about activity restricts lives. Technological aids monitor activity but lack the necessary psychological support. This paper proposes a new sonification framework, Go-with-the-Flow, informed by physiotherapists and people with CP. The framework proposes the articulation of user-defined sonified exercise spaces (SESs), tailored to psychological needs and physical capabilities, that enhance body and movement awareness to rebuild confidence in physical activity. A smartphone-based wearable device and a Kinect-based device were designed based on the framework to track movement and breathing and sonify them during physical activity. In controlled studies conducted to evaluate the sonification strategies, people with CP reported increased performance, motivation, awareness of movement and relaxation with sound feedback. Home studies, a focus group and a survey of CP patients conducted at the end of a hospital pain management session provided an in-depth understanding of how different aspects of the SESs and their calibration can facilitate self-directed rehabilitation, and of how the wearable version of the device can facilitate the transfer of gains from exercise to feared or demanding activities in real life. We conclude by discussing the implications of our findings for the design of technology for physical rehabilitation.
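
    As one concrete (and assumed) example of what a sonified exercise space might do, the sketch below maps a tracked joint angle to a tone frequency inside a user-defined range; the tracked variable, the ranges, and the linear mapping are illustrative choices, not the paper's design:

        # Illustrative sketch: the tracked variable, ranges, and linear mapping are assumptions.
        import numpy as np

        def angle_to_frequency(angle_deg, angle_range=(0.0, 90.0), freq_range=(220.0, 880.0)):
            """Linearly map a joint angle (degrees) to a pitch (Hz) within the exercise space."""
            a0, a1 = angle_range
            f0, f1 = freq_range
            t = np.clip((angle_deg - a0) / (a1 - a0), 0.0, 1.0)
            return f0 + t * (f1 - f0)

        # Example: a stretch progressing from 10 to 70 degrees of flexion.
        for angle in (10, 30, 50, 70):
            print(f"{angle:3d} deg -> {angle_to_frequency(angle):6.1f} Hz")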

    Bacteria Hunt: A multimodal, multiparadigm BCI game

    Brain-Computer Interfaces (BCIs) allow users to control applications with their brain activity. Among their possible applications for non-disabled people, games are promising candidates. BCIs can enrich gameplay with the mental and affective state information they convey. During the eNTERFACE’09 workshop we developed the Bacteria Hunt game, which can be played by keyboard and BCI, using SSVEP and relative alpha power. We conducted experiments to investigate what effect positive versus negative neurofeedback would have on subjects' relaxation states and how well the different BCI paradigms can be used together. We observed no significant difference in mean alpha band power (and thus relaxation) or in user experience between the games applying positive and negative feedback. We also found that alpha power before SSVEP stimulation was significantly higher than alpha power during SSVEP stimulation, indicating that there is some interference between the two BCI paradigms.
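
    The relative alpha power used as one of the game's BCI inputs can be illustrated with a short sketch; the sampling rate, band edges, and Welch parameters below are assumptions, not the workshop setup:

        # Illustrative sketch: sampling rate, band edges, and window length are assumptions.
        import numpy as np
        from scipy.signal import welch

        def relative_alpha_power(eeg, fs=256, alpha=(8.0, 12.0), broadband=(1.0, 45.0)):
            """Alpha-band power divided by broadband power for a 1-D EEG signal."""
            freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
            in_alpha = (freqs >= alpha[0]) & (freqs <= alpha[1])
            in_broad = (freqs >= broadband[0]) & (freqs <= broadband[1])
            return np.trapz(psd[in_alpha], freqs[in_alpha]) / np.trapz(psd[in_broad], freqs[in_broad])

        # Example with synthetic data: a 10 Hz oscillation plus noise.
        fs = 256
        t = np.arange(0, 10, 1 / fs)
        signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)
        print(f"Relative alpha power: {relative_alpha_power(signal, fs):.2f}")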